Tight risk bounds for multi-class margin classifiers
Authors
Abstract
Similar resources
Risk bounds for CART classifiers under a margin condition
Non-asymptotic risk bounds for Classification And Regression Trees (CART) classifiers are obtained in the binary supervised classification framework under a margin assumption on the joint distribution of the covariates and the labels. These risk bounds are derived conditionally on the construction of the maximal binary tree and allow one to prove that the linear penalty used in the CART pruning alg...
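For context (standard CART terminology, not quoted from this abstract), the linear penalty referred to here is presumably the cost-complexity criterion used in CART pruning: for a subtree $T$ of the maximal tree, one minimizes $\mathrm{crit}_\alpha(T) = \hat{R}_n(T) + \alpha\,|\tilde{T}|$, where $\hat{R}_n(T)$ is the empirical misclassification rate of $T$, $|\tilde{T}|$ is its number of leaves, and $\alpha > 0$ is the pruning parameter; the penalty is linear in the number of leaves.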
Tight frame approximation for multi-frames and super-frames
In this thesis, a generator for multi-frames or super-frames generated under the action of a projective unitary representation of a countable discrete group is studied. Examples of such frames are Gabor multi-frames, Gabor super-frames, and frames for shift-invariant subspaces. We show that there exists a unique normalized tight multi-frame (super-frame) generator that attains the minimum distance from it. Similar problems for dual frames are also considered, and some ...
The Margin Vector, Admissible Loss and Multi-class Margin-based Classifiers
We propose a new framework to construct margin-based classifiers, in which the binary and multicategory classification problems are solved by the same principle, namely margin-based classification via regularized empirical risk minimization. To build the framework, we propose the margin vector, which is the multi-class generalization of the margin, and then further generalize the concept of ...
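As a brief sketch of the principle this abstract describes (notation assumed here, not taken from the paper): in the binary case, margin-based classification via regularized empirical risk minimization solves $\hat{f} = \arg\min_{f \in \mathcal{F}} \frac{1}{n}\sum_{i=1}^{n} \phi\big(y_i f(x_i)\big) + \lambda J(f)$, where $\phi$ is a margin-based surrogate loss (e.g. hinge or logistic), $J$ is a regularizer and $\lambda > 0$ its tuning parameter; the margin vector mentioned above generalizes the scalar margin $y f(x)$ so that the same formulation covers the multicategory case.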
Margin Adaptive Risk Bounds for Classification Trees
Margin adaptive risk bounds for Classification and Regression Trees (CART, Breiman et al., 1984) classifiers are obtained in the binary supervised classification framework. These risk bounds are obtained conditionally on the construction of the maximal binary tree and make it possible to prove that the linear penalty used in the CART pruning algorithm is valid under a margin condition. It is also show...
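For context (standard terminology, not quoted from the abstract), the margin condition invoked in such results is typically the Mammen–Tsybakov assumption: writing $\eta(x) = \mathbb{P}(Y = 1 \mid X = x)$, there exist $C > 0$ and $\alpha \ge 0$ such that $\mathbb{P}\big(|\eta(X) - 1/2| \le t\big) \le C\, t^{\alpha}$ for all $t > 0$; larger $\alpha$ means less probability mass near the decision boundary and yields faster convergence rates.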
Tight Bounds for the Expected Risk of Linear Classifiers and PAC-Bayes Finite-Sample Guarantees
We analyze the expected risk of linear classifiers for a fixed weight vector in the “minimax” setting. That is, we analyze the worst-case risk among all data distributions with a given mean and covariance. We provide a simpler proof of the tight polynomial-tail bound for general random variables. For sub-Gaussian random variables, we derive a novel tight exponential-tail bound. We also provide n...
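A classical instance of such a tight polynomial-tail bound (added here for context, not taken from the abstract) is Cantelli's one-sided inequality: for any random variable $Z$ with mean $\mu$ and variance $\sigma^2$, $\mathbb{P}(Z \le \mu - t) \le \sigma^2 / (\sigma^2 + t^2)$ for $t > 0$, with equality attained by a two-point distribution. Applied to the score $Z = y\,w^\top x$ of a fixed linear classifier with $\mu = \mathbb{E}[Z] > 0$, it gives the worst-case misclassification bound $\mathbb{P}(Z \le 0) \le \sigma^2 / (\sigma^2 + \mu^2)$ over all data distributions with the given mean and covariance.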
Journal
Journal title: Pattern Recognition and Image Analysis
Year: 2016
ISSN: 1054-6618, 1555-6212
DOI: 10.1134/s105466181604009x